Recalibration of Neural Networks

An R package

Student: Carolina Musso

Supervisor: Guilherme Souza Rodrigues

Department of Statistics - UnB

Introduction: A bit of history

  • It is not an (entirely) new technique.

GOODFELLOW et al. (2016)

Introduction: Today

GOODFELLOW et al. (2016)

  • Dropout (2013)
  • Adam optimization algorithm (2014)
  • Batch Normalization (2015)

Introduction: Why they are good

GOODFELLOW et al. (2016)

Introduction: Problems

  • Low error is not the only property a neural network (NN) should have.

  • It should also be able to quantify its uncertainty.

Calibration

  • A 95% confidence interval should contain the true output 95% of the time.

Introduction: Calibration

Formally:

\[\mathbb{P}\left(Y \leq \hat{F}_Y^{-1}(p)\right) = p, \quad \forall\, p \in [0,1]\]

KULESHOV & DESHPANDE (2022)

  • Modern NNs are often poorly calibrated.
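The calibration condition above can be checked empirically: the nominal level \(p\) should match the observed coverage. A minimal sketch in R, on simulated data and assuming (purely for illustration) a Gaussian predictive distribution; none of these names come from the package:

```r
# Empirical calibration check: for a calibrated model,
# P(Y <= F_hat^{-1}(p)) should equal p for every p in [0, 1].
# Illustrative simulated data; not this package's API.
set.seed(1)
n     <- 10000
mu    <- rnorm(n)                      # predictive means
y     <- rnorm(n, mean = mu, sd = 1)   # outputs drawn from the true model
p     <- 0.95
q_hat <- qnorm(p, mean = mu, sd = 1)   # predictive quantile F_hat^{-1}(0.95)
mean(y <= q_hat)                       # observed coverage, close to 0.95
```

Repeating the check over a grid of levels \(p\) yields the reliability diagram used throughout the recalibration literature.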

Introduction: Recalibration

  • There are techniques such as:
    • TORRES (2023)

    • KULESHOV; FENNER; ERMON (2018)

    • KULESHOV; DESHPANDE (2022)

KULESHOV & DESHPANDE (2022)

Introduction: Motivation

  • However, these methods can be hard for users to learn and implement.

  • Packages are therefore a useful way to share such routines.

General objective

  • Develop an R package implementing the three methods above, making recalibration practices easier.

Specific Objectives

  • Organize the available code in an optimal and user-friendly way.

  • Ensure usability by following the best practices from the start.

  • Thoroughly document the functions/package.

  • Share the package with the community:

    • GitHub, CRAN, paper.

Methods

Code

  • Presented by TORRES (2023).

Approach

  • Optimize and wrap the code in user-friendly functions:
    • Conventions from WICKHAM; BRYAN (2015)
    • The hardhat package, VAUGHAN; KUHN (2023).
  • RStudio IDE; documentation with the roxygen2 package; vignettes/paper written in .Rmd.
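As a sketch of the documentation workflow, a roxygen2-commented function might look like the following. The function name, signature, and Gaussian assumption are purely illustrative, not the package's actual API:

```r
#' Compute PIT values for Gaussian predictive distributions
#'
#' Returns p_i = F_i(y_i | x_i), the predictive CDF evaluated at each
#' observation; departures from Uniform(0, 1) indicate miscalibration.
#' (Hypothetical example function, not this package's real API.)
#'
#' @param y Numeric vector of observed outputs.
#' @param mu Numeric vector of predictive means.
#' @param sigma Numeric vector of predictive standard deviations.
#' @return Numeric vector of PIT values in [0, 1].
#' @export
pit_values <- function(y, mu, sigma = 1) {
  stopifnot(length(y) == length(mu), all(sigma > 0))
  pnorm(y, mean = mu, sd = sigma)
}
```

Running devtools::document() then turns these comments into the .Rd help pages shipped with the package.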

Methods

  • There are techniques such as: 1) TORRES (2023); 2) KULESHOV; FENNER; ERMON (2018); 3) KULESHOV; DESHPANDE (2022).

  • Evaluate calibration and perform recalibration.
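A minimal sketch of the recalibration step, in the spirit of KULESHOV; FENNER; ERMON (2018): learn a monotone map that sends the model's predicted probabilities to their observed frequencies. Illustrative code only, built on base R's `ecdf` and `isoreg`; the package's interface may differ:

```r
# Quantile recalibration sketch (after KULESHOV; FENNER; ERMON, 2018):
# fit a monotone map so that recal(F(y|x)) is approximately uniform.
# Illustrative simulated data; not this package's API.
set.seed(2)
n   <- 5000
mu  <- rnorm(n)
y   <- rnorm(n, mean = mu, sd = 2)    # model is overconfident (it assumes sd = 1)
p   <- pnorm(y, mean = mu, sd = 1)    # miscalibrated PIT values
o   <- order(p)
emp <- ecdf(p)                        # empirical CDF of the PIT values
fit <- isoreg(p[o], emp(p[o]))        # isotonic fit of the recalibration map
recal <- approxfun(fit$x, fit$yf, rule = 2)
p_new <- recal(p)                     # recalibrated probabilities
```

After the transformation, the fraction of `p_new` below any level is close to that level, which is exactly the calibration condition stated earlier.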

Example

\[p_i = \hat{F}_i(y_i \mid x_i)\]

TORRES (2023)
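The quantity above is the probability integral transform (PIT): each observation passed through its own predictive CDF. Under perfect calibration the \(p_i\) are Uniform(0, 1), so comparing nominal levels with observed frequencies gives a reliability diagram. A hedged sketch assuming Gaussian predictive distributions (illustrative names, not the package API):

```r
# p_i = F_hat_i(y_i | x_i): the PIT of each observation. For a calibrated
# model these are Uniform(0, 1); the reliability diagram compares each
# nominal level p with the observed frequency mean(p_i <= p).
set.seed(3)
n    <- 2000
mu   <- rnorm(n)
y    <- rnorm(n, mean = mu, sd = 1)       # here the model is well specified
p_i  <- pnorm(y, mean = mu, sd = 1)       # the p_i of the formula above
grid <- seq(0.1, 0.9, by = 0.1)           # nominal levels
obs  <- sapply(grid, function(p) mean(p_i <= p))
round(cbind(nominal = grid, observed = obs), 2)
```

For a miscalibrated model the observed column drifts away from the nominal one, and that gap is what the recalibration methods correct.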

Timeline

References

GOODFELLOW, I.; BENGIO, Y.; COURVILLE, A. Deep learning. [s.l.] MIT Press, 2016.
KULESHOV, V.; DESHPANDE, S. Calibrated and sharp uncertainties in deep learning via density estimation. Proceedings of the International Conference on Machine Learning. PMLR, 2022.
KULESHOV, V.; FENNER, N.; ERMON, S. Accurate uncertainties for deep learning using calibrated regression. Proceedings of the International Conference on Machine Learning. PMLR, 2018.
TORRES, R. Quantile-based Recalibration of Artificial Neural Networks. Master’s thesis—Distrito Federal, Brazil: University of Brasília, 2023.
VAUGHAN, D.; KUHN, M. hardhat: Construct modeling packages. R package. [s.l: s.n.].
WICKHAM, H.; BRYAN, J. R packages: Organize, test, document, and share your code. [s.l.] O’Reilly Media, Inc., 2015.